Orthogonal basis

Results: 27 (items 11–20 shown below)

11. Sequential Karhunen–Loeve Basis Extraction and its Application to Images (IEEE Transactions on Image Processing, Vol. 9, No. 8, August [removed])

Source URL: www.cs.technion.ac.il

Language: English - Date: 2012-05-31 10:47:18

12. ConstructiveFunctions2014

Source URL: math.nist.gov

Language: English - Date: 2014-07-18 19:39:45

13. Partial LLL Reduction (Xiaohu Xie, Xiao-Wen Chang, Mazen Al Borno)

Source URL: www.cs.mcgill.ca

Language: English - Date: 2011-08-29 23:51:55

14. Microsoft Word - nl214.docx

Source URL: math.nist.gov

Language: English - Date: 2014-07-18 16:56:37

15. Adaptive Control with Multiresolution Bases (Christophe P. Bernard, Centre de Mathématiques Appliquées, École Polytechnique, [removed] Palaiseau cedex, France, [removed])

Source URL: web.mit.edu

Language: English - Date: 2005-05-16 14:06:15

16. Eigenvectors and Diagonalizing Matrices (E. L. Lady): Let A be an n × n matrix and suppose there exists a basis v1, …, vn for Rⁿ such that for each i, A vi = λi vi for some scalar λi (i.e., vi is an eigenvector for …). A short worked example of this statement appears after the list.

Source URL: www.math.hawaii.edu

Language: English - Date: 2001-04-07 05:53:14

17. Recursive Orthogonal Least Squares Learning with Automatic Weight Selection for Gaussian Neural Networks (Meng H. Fun, [removed], Oklahoma State University; Martin T. Hagan, [removed], Oklahoma S…)

Source URL: hagan.ecen.ceat.okstate.edu

Language: English - Date: 2008-01-23 11:12:51

18. Recursive Orthogonal Least Squares Learning with Automatic Weight Selection for Gaussian Neural Networks (Meng H. Fun, [removed], Oklahoma State University; Martin T. Hagan, [removed], Oklahoma S…)

Source URL: hagan.okstate.edu

Language: English - Date: 2008-01-23 11:12:51

19. Index: 2 × 2 determinant, 71; algorithm, Gauss-Jordan, 8; angle between vectors, 166; asymptotes, 137; basis, left-to-right algorithm, 62

Source URL: www.numbertheory.org

Language: English - Date: 2010-02-10 05:25:58

20. The FWM Impairment in Coherent OFDM Compounds on a Phased-Array Basis (© 2008 OSA: COTA/ICQI/IPNRA/SL, a296_1.pdf, CWA4.pdf)

Source URL: www.celight.com

Language: English - Date: 2014-01-01 15:39:35
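
Worked example for item 16 (a minimal sketch of our own, not drawn from E. L. Lady's note or any other listed document): if A vi = λi vi for a basis v1, …, vn, then collecting the basis vectors as the columns of P = [v1 ⋯ vn] and setting D = diag(λ1, …, λn) gives AP = PD, hence A = P D P⁻¹. A concrete 2 × 2 instance:

\[
A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}, \quad
v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix},\ \lambda_1 = 2, \quad
v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix},\ \lambda_2 = 3,
\]
\[
P = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \quad
D = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}, \quad
P^{-1} = \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix}, \quad
P D P^{-1} = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix} = A.
\]

One can check directly that A v1 = 2 v1 and A v2 = 3 v2; since the two eigenvalues are distinct, v1 and v2 are linearly independent and therefore form the required basis.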